Tensor Regression Meets Gaussian Processes
Abstract
Low-rank tensor regression, a class of models that learns high-order correlations from data, has recently received considerable attention. At the same time, Gaussian processes (GPs) are well-studied machine learning models for structure learning. In this paper, we demonstrate interesting connections between the two, especially for multi-way data analysis. We show that low-rank tensor regression is essentially learning a multi-linear kernel in Gaussian processes, and that the low-rank assumption translates into a constrained Bayesian inference problem. We prove an oracle inequality and derive the average-case learning curve for the equivalent GP model. Our finding implies that low-rank tensor regression, though empirically successful, depends strongly on the eigenvalues of the covariance functions as well as on variable correlations.
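To make the claimed equivalence concrete, the following is a minimal sketch (not the authors' code; the RBF per-mode kernels, grid sizes, and noise level are illustrative assumptions) of GP regression with a multi-linear kernel, i.e. a Kronecker product of per-mode kernels:

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Squared-exponential kernel matrix between two 1-D coordinate arrays."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Per-mode coordinates of a small 2-way grid (assumed for illustration).
x1 = np.linspace(0.0, 1.0, 10)   # mode-1 coordinates
x2 = np.linspace(0.0, 1.0, 8)    # mode-2 coordinates

# Multi-linear kernel: Kronecker product of the per-mode kernels,
# i.e. k((i,j), (i',j')) = k1(i, i') * k2(j, j').
K = np.kron(rbf(x1, x1), rbf(x2, x2))            # (80, 80) covariance

# Synthetic vectorized tensor response drawn from this GP prior.
rng = np.random.default_rng(0)
noise = 1e-2
y = rng.multivariate_normal(np.zeros(80), K + noise * np.eye(80))

# Standard GP posterior mean at the training inputs.
alpha = np.linalg.solve(K + noise * np.eye(80), y)
posterior_mean = K @ alpha
```

Because the covariance factorizes over modes, the spectra of the per-mode kernels directly determine the eigenvalues of K, which is exactly the quantity the abstract identifies as governing performance.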
Similar resources
Fast Laplace Approximation for Gaussian Processes with a Tensor Product Kernel
Gaussian processes provide a principled Bayesian framework, but direct implementations are restricted to small data sets due to their cubic time cost in the data size. When the kernel function is expressible as a tensor product kernel and the input data lie on a multidimensional grid, it has been shown that the computational cost of Gaussian process regression can be reduced considerably. Tensor ...
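The saving alluded to here comes from standard Kronecker algebra. Below is a hedged sketch (not the paper's Laplace approximation; `kron_solve` is a hypothetical helper name) of how a tensor product kernel on a grid lets the key linear solve avoid ever forming the full kernel matrix:

```python
import numpy as np

def kron_solve(K1, K2, y, s):
    """Solve (kron(K1, K2) + s * I) a = y using per-mode eigendecompositions."""
    w1, Q1 = np.linalg.eigh(K1)                  # K1 = Q1 diag(w1) Q1^T
    w2, Q2 = np.linalg.eigh(K2)
    Y = y.reshape(K1.shape[0], K2.shape[0])
    T = Q1.T @ Y @ Q2                            # rotate into the joint eigenbasis
    T = T / (np.outer(w1, w2) + s)               # Kronecker eigenvalues are pairwise products
    return (Q1 @ T @ Q2.T).ravel()               # rotate back

# Small symmetric positive-definite stand-ins for per-mode kernel matrices.
rng = np.random.default_rng(1)
A1 = rng.standard_normal((6, 6)); K1 = A1 @ A1.T + 6 * np.eye(6)
A2 = rng.standard_normal((5, 5)); K2 = A2 @ A2.T + 5 * np.eye(5)
y = rng.standard_normal(30)

a = kron_solve(K1, K2, y, s=0.1)
dense = np.linalg.solve(np.kron(K1, K2) + 0.1 * np.eye(30), y)
assert np.allclose(a, dense)                     # matches the dense solve
```

The per-mode eigendecompositions cost O(n1^3 + n2^3) rather than the O((n1 n2)^3) of the dense solve, which is where the considerable reduction comes from.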
Doubly Decomposing Nonparametric Tensor Regression
A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared to naive nonparametric approaches, our formulation considerably improves the convergence rate of estimation while maintaining consistency with the same function class under specific conditions. To...
Processor-time-optimal systolic arrays
Minimizing the amount of time and number of processors needed to perform an application reduces the application's fabrication cost and operation costs. A directed acyclic graph (dag) model of algorithms is used to define a time-minimal schedule and a processor-time-minimal schedule. We present a technique for finding a lower bound on the number of processors needed to achieve a given schedule of a...
Low-rank tensor approximation for high-order correlation functions of Gaussian random fields
Gaussian random fields are widely used as building blocks for modeling stochastic processes. This paper is concerned with the efficient representation of d-point correlations for such fields, which in turn enables the representation of more general stochastic processes that can be expressed as a function of one (or several) Gaussian random fields. Our representation consists of two ingredients....
Scalable Gaussian Processes with Billions of Inducing Inputs via Tensor Train Decomposition
We propose a method (TT-GP) for approximate inference in Gaussian Process (GP) models. We build on previous scalable GP research including stochastic variational inference based on inducing inputs, kernel interpolation, and structure exploiting algebra. The key idea of our method is to use Tensor Train decomposition for variational parameters, which allows us to train GPs with billions of induc...
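For readers unfamiliar with the format, here is a minimal sketch of the generic TT-SVD construction (not TT-GP's variational scheme; `tt_svd` and the rank cap are illustrative assumptions), showing how a dense tensor compresses into a chain of small cores:

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Factor a dense tensor into TT cores G_k of shape (r_{k-1}, n_k, r_k)."""
    shape = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(shape[0], -1)
    for n in shape[:-1]:
        mat = mat.reshape(r_prev * n, -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, n, r))
        mat = s[:r, None] * Vt[:r]               # carry the remainder to the next mode
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

# A rank-1 4-way tensor is represented exactly with all TT ranks equal to 1.
v = [np.arange(1.0, 4.0), np.ones(2), np.array([2.0, -1.0, 0.5]), np.ones(3)]
T = np.einsum('i,j,k,l->ijkl', *v)
cores = tt_svd(T, max_rank=1)

# Reconstruct by contracting the chain of cores and verify.
rec = cores[0]
for G in cores[1:]:
    rec = np.tensordot(rec, G, axes=([-1], [0]))
assert np.allclose(rec.reshape(T.shape), T)
```

Storage drops from the product of all mode sizes to a sum of per-core sizes, which is what makes billions of (structured) inducing inputs tractable in approaches like TT-GP.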
Journal: CoRR
Volume: abs/1710.11345
Publication date: 2017